Partial knowledge, entropy, and estimation
Authors
Abstract
Similar Articles
Partial Solution and Entropy
If the given problem instance is partially solved, we want to minimize our effort to solve the problem using that information. In this paper we introduce the measure of entropy H(S) for uncertainty in partially solved input data S(X) = (X1, ..., Xk), where X is the entire data set, and each Xi is already solved. We use the entropy measure to analyze three example problems, sorting, shortest pat...
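The abstract truncates before the formal definition of H(S). As an illustrative sketch only (this is not the paper's exact definition), one natural way to quantify the residual uncertainty when the input X is split into k already-sorted runs of sizes n_1, …, n_k is the log of the number of total orderings still consistent with those runs, log2(n! / (n_1!·…·n_k!)); the helper below is hypothetical:

```python
import math

def partial_sort_entropy(run_sizes):
    """Bits of uncertainty left when the input is split into k
    already-sorted runs: log2(n! / (n_1! * ... * n_k!)), i.e. the
    log of the number of interleavings consistent with the runs.
    (Illustrative; not the paper's exact H(S).)"""
    n = sum(run_sizes)
    log2_fact = lambda m: math.lgamma(m + 1) / math.log(2)
    return log2_fact(n) - sum(log2_fact(m) for m in run_sizes)

# Two sorted runs of sizes 2 and 2: 4!/(2!*2!) = 6 interleavings.
print(partial_sort_entropy([2, 2]))  # ≈ log2(6) ≈ 2.585
```

A fully sorted input (one run of size n) gives zero entropy, matching the intuition that no work remains.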
On the Estimation of Shannon Entropy
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994) and Correa (1995). A simulation st...
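Vasicek's (1976) estimator, one of the baselines named above, is built from m-spacings of the order statistics; a minimal sketch (parameter names are mine, and the edge handling follows the standard clamped formulation):

```python
import math

def vasicek_entropy(sample, m):
    """Vasicek (1976) m-spacing entropy estimate:
    H = (1/n) * sum_i ln( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order-statistic indices clamped to the range 1..n."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]          # x_(i-m), clamped to x_(1)
        hi = x[min(i + m, n - 1)]      # x_(i+m), clamped to x_(n)
        total += math.log(n / (2 * m) * (hi - lo))
    return total / n

# On an evenly spaced grid in (0, 1) the estimate is close to the
# true entropy of Uniform(0, 1), which is 0 nats.
grid = [(i + 0.5) / 1000 for i in range(1000)]
print(vasicek_entropy(grid, m=10))  # small negative value near 0
```

The estimator is consistent but biased downward for finite n, which is why the article's simulation comparison against the later variants is of interest.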
Discharge Estimation by using Tsallis Entropy Concept
Flow-rate measurement in rivers under different conditions is required for river management purposes including water resources planning, pollution prevention, and flood control. This study proposed a new discharge estimation method by using a mean velocity derived from a 2D velocity distribution formula based on Tsallis entropy concept. This procedure is done based on several factors which refl...
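The abstract does not reproduce the velocity-distribution derivation, but the Tsallis entropy it builds on is, in the discrete case, S_q = (1 − Σ p_i^q)/(q − 1), which recovers Shannon entropy as q → 1. A minimal sketch (the paper itself works with a continuous velocity distribution):

```python
import math

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1);
    approaches the Shannon entropy -sum(p_i * ln p_i) as q -> 1."""
    if abs(q - 1) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

uniform = [0.25] * 4
print(tsallis_entropy(uniform, q=2))      # (1 - 4*0.0625)/1 = 0.75
print(tsallis_entropy(uniform, q=1.001))  # close to ln(4) ≈ 1.386
```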
Partial Knowledge in Multiple-Choice Testing
The intent of this study was to discover the nature of (partial) knowledge as estimated by the multiple-choice (MC) test method. An MC test of vocabulary, including 20 items, was given to 10 participants. Each examinee was required to think aloud while focusing on each item before and while making a response. After each test taker was done with each item, s/he was ...
Estimation Entropy and Optimal Observability
We consider a pair of correlated processes {Z_n} and {S_n}, n = −∞, …, ∞, where the former is observable and the latter is hidden. The uncertainty in the estimation of S_n given the finite past history Z_0^{n−1} is H(S_n | Z_0^{n−1}), which is a sequence in n. The limit of the Cesàro mean of this sequence is called the estimation entropy. We show that the estimation entropy is the long-run average entropy of the ...
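The estimation entropy above is a limit of conditional entropies H(S_n | Z_0^{n−1}); for a single step, the conditional entropy of a hidden variable given an observation follows from the joint distribution via H(S | Z) = H(S, Z) − H(Z). A small self-contained sketch (the joint table is made up for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def conditional_entropy(joint):
    """H(S | Z) = H(S, Z) - H(Z) for a joint table joint[s][z]."""
    flat = [p for row in joint for p in row]
    pz = [sum(row[z] for row in joint) for z in range(len(joint[0]))]
    return entropy(flat) - entropy(pz)

# Hidden bit S observed through a noisy channel Z (made-up numbers):
# joint[s][z] = P(S = s, Z = z), crossover probability 0.2.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(conditional_entropy(joint))  # ≈ 0.722 bits
```

Here the observation Z leaves H(0.2) ≈ 0.722 bits of uncertainty about S, down from the 1 bit of the unconditioned hidden variable.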
Journal
Journal title: Proceedings of the National Academy of Sciences
Year: 1975
ISSN: 0027-8424, 1091-6490
DOI: 10.1073/pnas.72.10.3819